to tell the truth
idiom
—used to say that one is stating what one really thinks
I didn't really like the movie, to tell the truth.